Modify Leave-One-Out Cross Validation by Moving Validation Samples around Random Normal Distributions: Move-One-Away Cross Validation
Authors
Abstract
Similar resources
On The Value of Leave-One-Out Cross-Validation Bounds
A long-standing problem in classification is the determination of the regularization parameter. Nearly every classification algorithm uses a parameter (or set of parameters) to control classifier complexity. Cross-validation on the training set is usually done to determine the regularization parameter(s). [1] proved a leave-one-out cross-validation (LOOCV) bound for a class of kernel classifiers...
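The regularization-parameter search the snippet refers to can be sketched generically; the toy data, the candidate C values, and the use of scikit-learn's SVC as the kernel classifier are illustrative assumptions, not the bound-based procedure of [1].

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

# Toy stand-in for the training set.
X, y = make_classification(n_samples=60, n_features=10, random_state=0)

# Brute-force LOOCV over candidate regularization values C: each candidate
# trains n models, one per held-out sample, and keeps the mean accuracy.
candidates = [0.01, 0.1, 1.0, 10.0, 100.0]
scores = {
    C: cross_val_score(SVC(C=C, kernel="rbf"), X, y, cv=LeaveOneOut()).mean()
    for C in candidates
}
best_C = max(scores, key=scores.get)
print(scores, "-> selected C =", best_C)
```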
Fast exact leave-one-out cross-validation of sparse least-squares support vector machines
Leave-one-out cross-validation has been shown to give an almost unbiased estimator of the generalisation properties of statistical models, and therefore provides a sensible criterion for model selection and comparison. In this paper we show that exact leave-one-out cross-validation of sparse Least-Squares Support Vector Machines (LS-SVMs) can be implemented with a computational complexity of on...
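The "exact leave-one-out from a single fit" idea can be illustrated with the closely related, simpler case of kernel ridge regression, where all n leave-one-out residuals follow from the hat matrix of one fit; this is a sketch under that assumption, not the LS-SVM derivation in the paper (which also handles a bias term).

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def loo_residuals_kernel_ridge(K, y, lam):
    """Exact LOO residuals for kernel ridge regression in one factorization.

    For a linear smoother y_hat = H y with H = K (K + lam*I)^{-1}, the
    leave-one-out residual is e_i / (1 - H_ii), so no model is refit n times.
    """
    n = len(y)
    H = K @ np.linalg.solve(K + lam * np.eye(n), np.eye(n))
    e = y - H @ y                      # residuals of the full fit
    return e / (1.0 - np.diag(H))

# Usage: pick the ridge parameter by minimizing the LOO mean squared error.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
K = rbf_kernel(X)
for lam in (1e-3, 1e-2, 1e-1, 1.0):
    print(lam, np.mean(loo_residuals_kernel_ridge(K, y, lam) ** 2))
```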
Bayesian Leave-One-Out Cross-Validation Approximations for Gaussian Latent Variable Models
The future predictive performance of a Bayesian model can be estimated using Bayesian cross-validation. In this article, we consider Gaussian latent variable models where the integration over the latent values is approximated using the Laplace method or expectation propagation (EP). We study the properties of several Bayesian leave-one-out (LOO) crossvalidation approximations that in most cases...
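As a point of contrast with the Laplace/EP approximations studied in that article, a generic Bayesian LOO approximation that needs only posterior draws is plain importance-sampling LOO (without the Pareto-smoothing refinement); the Gaussian toy model below is a made-up stand-in.

```python
import numpy as np
from scipy.special import logsumexp

def is_loo_elpd(log_lik):
    """Importance-sampling estimate of the LOO expected log predictive density.

    log_lik has shape (S, n): log p(y_i | theta_s) for S posterior draws and
    n observations.  For each i,
        p(y_i | y_-i) ~= [ (1/S) * sum_s 1 / p(y_i | theta_s) ]^{-1},
    i.e. a harmonic mean of the pointwise likelihoods over full-posterior draws.
    """
    S = log_lik.shape[0]
    log_mean_inv = logsumexp(-log_lik, axis=0) - np.log(S)
    pointwise = -log_mean_inv          # log p(y_i | y_-i), one per observation
    return pointwise.sum(), pointwise

# Usage with synthetic draws: Gaussian data, unknown mean, known unit variance.
rng = np.random.default_rng(0)
y = rng.normal(0.5, 1.0, size=40)
mu_draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=1000)  # stand-in posterior
log_lik = -0.5 * np.log(2 * np.pi) - 0.5 * (y[None, :] - mu_draws[:, None]) ** 2
elpd, _ = is_loo_elpd(log_lik)
print("estimated elpd_loo:", elpd)
```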
Feature Scaling for Kernel Fisher Discriminant Analysis Using Leave-One-Out Cross Validation
Kernel Fisher discriminant analysis (KFD) is a successful approach to classification. It is well known that the key challenge in KFD lies in the selection of free parameters such as kernel parameters and regularization parameters. Here we focus on the feature-scaling kernel where each feature individually associates with a scaling factor. A novel algorithm, named FS-KFD, is developed to tune th...
Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
Mika et al. (in: Neural Network for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–48) apply the "kernel trick" to obtain a non-linear variant of Fisher's linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a ...
Journal
Journal title: Applied Sciences
Year: 2020
ISSN: 2076-3417
DOI: 10.3390/app10072448